eigenvectors
n. characteristic vector; a nonzero vector which a given linear transformation maps to a scalar multiple of itself (Mathematics)

English contemporary dictionary. 2014.
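
Restated as a formula (added for illustration; not part of the dictionary text): a nonzero vector v is an eigenvector of a linear transformation A exactly when

    A\,v = \lambda\,v, \qquad v \neq 0,

where the scalar \lambda is the corresponding eigenvalue.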


Look at other dictionaries:

  • Eigenvalues and eigenvectors — For more specific information regarding the eigenvalues and eigenvectors of matrices, see Eigendecomposition of a matrix. In this shear mapping the red arrow changes direction but the blue arrow does not. Therefore the blue arrow is an… …   Wikipedia

  • Eigenvalue, eigenvector and eigenspace — In mathematics, given a linear transformation, an eigenvector of that linear transformation is a nonzero vector which, when that transformation is applied to it, changes in length, but not direction. For each eigenvector… …   Wikipedia

  • Eigendecomposition of a matrix — In the mathematical discipline of linear algebra, eigendecomposition or sometimes spectral decomposition is the factorization of a matrix into a canonical form, whereby the matrix is represented in terms of its eigenvalues and… …   Wikipedia

  • Principal component analysis — PCA of a multivariate Gaussian distribution centered at (1,3) with a standard deviation of 3 in roughly the (0.878, 0.478) direction and of 1 in the orthogonal direction. The vectors shown are the eigenvectors of the covariance matrix scaled by… (see the covariance-eigenvector sketch after this list) …   Wikipedia

  • Discrete Fourier transform — In mathematics, the discrete Fourier transform (DFT) is a specific kind of discrete transform, used in… …   Wikipedia

  • Perron–Frobenius theorem — In linear algebra, the Perron–Frobenius theorem, proved by Oskar Perron (1907) and Georg Frobenius (1912), asserts that a real square matrix with positive entries has a unique largest real eigenvalue and that the corresponding… …   Wikipedia

  • Principal components analysis — Principal component analysis (PCA) is a vector space transform often used to reduce multidimensional data sets to lower dimensions for analysis. Depending on the field of application, it is also named the discrete Karhunen-Loève transform (KLT),… …   Wikipedia

  • Eigenvalue algorithm — In linear algebra, one of the most important problems is designing efficient and stable algorithms for finding the eigenvalues of a matrix. These eigenvalue algorithms may also find eigenvectors (see the power-iteration sketch after this list)… …   Wikipedia

  • Compact operator on Hilbert space — In functional analysis, compact operators on Hilbert spaces are a direct extension of matrices: in the Hilbert spaces, they are precisely the closure of finite rank operators in the uniform operator topology. As such, results from matrix theory… …   Wikipedia

  • Diagonalizable matrix — In linear algebra, a square matrix A is called diagonalizable if it is similar to a diagonal matrix, i.e., if there exists an invertible matrix P such that P −1AP is a diagonal matrix. If V is a finite dimensional vector space, then a linear …   Wikipedia
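
As a concrete illustration of several entries above (eigenvalue algorithms, eigendecomposition, diagonalizable matrices), here is a minimal Python/NumPy sketch; the 2x2 matrix A is just an example chosen for the illustration, not anything taken from the articles.

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Power iteration: repeatedly apply A and renormalise; the iterate converges
# to an eigenvector for the eigenvalue of largest magnitude.
v = np.ones(2)
for _ in range(100):
    v = A @ v
    v /= np.linalg.norm(v)
lam = v @ A @ v                            # Rayleigh quotient estimate of that eigenvalue

print("dominant eigenvalue ~", lam)
print("A v - lam v =", A @ v - lam * v)    # close to the zero vector, i.e. A v = lam v

# Eigendecomposition / diagonalization: columns of P are eigenvectors,
# D is diagonal with the eigenvalues, and A = P D P^{-1}.
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)
print("reconstruction error:", np.max(np.abs(P @ D @ np.linalg.inv(P) - A)))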
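
And a similarly minimal sketch of the idea in the principal-component-analysis entries: the principal directions are eigenvectors of the data's covariance matrix. The synthetic data set below is an assumption made purely for the example.

import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D sample data (synthetic, for illustration only).
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0],
                                          [1.0, 1.0]])

C = np.cov(X, rowvar=False)            # 2x2 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # eigh: symmetric input, eigenvalues ascending

# The eigenvector with the largest eigenvalue is the first principal component;
# projecting the centred data onto it gives the 1-D view with maximal variance.
pc1 = eigvecs[:, -1]
scores = (X - X.mean(axis=0)) @ pc1
print("share of variance captured by PC1:", eigvals[-1] / eigvals.sum())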
